# Chinchilla scaling
**Cerebras-GPT 13B** (cerebras, Apache-2.0; Large Language Model, Transformers, English)

Cerebras-GPT 13B is a large language model trained with an open architecture on open datasets. It belongs to the Cerebras-GPT series, which studies the scaling laws of large language models and demonstrates the simplicity and scalability of training on the Cerebras software and hardware stack.
**Cerebras-GPT 2.7B** (cerebras, Apache-2.0; Large Language Model, Transformers, English)

Cerebras-GPT 2.7B is a Transformer-based language model intended to support research on large language models and to serve as a base model for natural language processing tasks.
**Cerebras-GPT 590M** (cerebras, Apache-2.0; Large Language Model, Transformers, English)

Cerebras-GPT 590M is a Transformer-based language model in the Cerebras-GPT family, built to study the scaling laws of large language models and to demonstrate the simplicity and scalability of training large models on the Cerebras software and hardware stack.
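
The section heading refers to the Chinchilla compute-optimal recipe that the Cerebras-GPT series is associated with: roughly 20 training tokens per parameter, with training compute approximated as 6·N·D FLOPs for N parameters and D tokens. The sketch below is a minimal illustration of that rule applied to the three model sizes above; the 20x ratio and the 6ND approximation are the standard Chinchilla figures, not values taken from this page.

```python
# Minimal illustration of the Chinchilla compute-optimal rule:
# train on ~20 tokens per parameter, with training compute of roughly
# 6 * N * D FLOPs for N parameters and D training tokens.

def chinchilla_optimal(params: float) -> tuple[float, float]:
    """Return (compute-optimal training tokens, approximate training FLOPs)."""
    tokens = 20 * params           # ~20 tokens per parameter
    flops = 6 * params * tokens    # ~6N FLOPs per trained token
    return tokens, flops

for name, n_params in [("Cerebras-GPT 590M", 590e6),
                       ("Cerebras-GPT 2.7B", 2.7e9),
                       ("Cerebras-GPT 13B", 13e9)]:
    tokens, flops = chinchilla_optimal(n_params)
    print(f"{name}: ~{tokens / 1e9:.0f}B tokens, ~{flops:.1e} training FLOPs")
```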
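
All three entries are tagged for the Transformers library. Below is a minimal loading sketch; the repository ID `cerebras/Cerebras-GPT-590M` is an assumption about where the Apache-2.0 checkpoints are published, so substitute the ID of the checkpoint you actually use.

```python
# Minimal sketch of running a Cerebras-GPT checkpoint via Hugging Face Transformers.
# "cerebras/Cerebras-GPT-590M" is an assumed repository ID; replace it with the
# checkpoint you intend to use (e.g. the 2.7B or 13B variant).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cerebras/Cerebras-GPT-590M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Scaling laws describe how language model loss changes as"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```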